How to Prepare for AWS Data Analytics Specialty Certification?
AWS DAS-C01 Certification Made Easy with VMExam.com

AWS DAS-C01 Exam Details
Exam Code: DAS-C01
Full Exam Name: AWS Certified Data Analytics - Specialty
No. of Questions: 65
Online Practice Exam: AWS Certified Data Analytics - Specialty Practice Test
Sample Questions: AWS DAS-C01 Sample Questions
Passing Score: 750 / 1000
Time Limit: 180 minutes
Exam Fee: $300 USD

How to Prepare for AWS DAS-C01?
• Get enough practice with the related Data Analytics Specialty practice tests on VMExam.com.
• Understand all the exam topics well.
• Identify your weak areas from the practice tests and practice those areas further on VMExam.com.

DAS-C01 Certification Exam Topics
Syllabus Topics and Weights:
● Collection: 18%
● Storage and Data Management: 22%
● Processing: 24%
● Analysis and Visualization: 18%
● Security: 18%

DAS-C01 Certification Training
Training:
● Data Analytics Fundamentals
● Big Data on AWS

AWS DAS-C01 Sample Questions

Que 01: A company is currently using Amazon DynamoDB as the database for a user support application. The company is developing a new version of the application that will store a PDF file for each support case, ranging in size from 1-10 MB. The file should be retrievable whenever the case is accessed in the application. How can the company store the file in the MOST cost-effective manner?

Options:
a) Store the file in Amazon DocumentDB and the document ID as an attribute in the DynamoDB table.
b) Store the file in Amazon S3 and the object key as an attribute in the DynamoDB table.
c) Split the file into smaller parts and store the parts as multiple items in a separate DynamoDB table.
d) Store the file as an attribute in the DynamoDB table using Base64 encoding.

Answer: b) Store the file in Amazon S3 and the object key as an attribute in the DynamoDB table.
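This pattern keeps DynamoDB items small and lets Amazon S3 hold the large binary; DynamoDB items are capped at 400 KB, so a 1-10 MB PDF could not be a single attribute anyway, and S3 storage is far cheaper per GB. A minimal boto3 sketch of the pattern, assuming a hypothetical SupportCases table keyed on CaseId and a hypothetical support-case-files bucket (both names are illustrative, not from the question):

import boto3

s3 = boto3.client("s3")
table = boto3.resource("dynamodb").Table("SupportCases")  # hypothetical table name

def attach_case_pdf(case_id: str, pdf_path: str) -> str:
    """Store the 1-10 MB PDF in S3; keep only a pointer to it in DynamoDB."""
    bucket = "support-case-files"  # hypothetical bucket name
    key = f"cases/{case_id}.pdf"
    s3.upload_file(pdf_path, bucket, key)  # the file itself lives in S3
    table.update_item(  # DynamoDB stores just the object key
        Key={"CaseId": case_id},
        UpdateExpression="SET PdfBucket = :b, PdfKey = :k",
        ExpressionAttributeValues={":b": bucket, ":k": key},
    )
    return key

def case_pdf_url(case_id: str) -> str:
    """Return a short-lived download link whenever the case is opened."""
    item = table.get_item(Key={"CaseId": case_id})["Item"]
    return s3.generate_presigned_url(
        "get_object",
        Params={"Bucket": item["PdfBucket"], "Key": item["PdfKey"]},
        ExpiresIn=300,
    )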
Que 02: A real estate company receives new property listing data from its agents through CSV files every day and stores these files in Amazon S3. The data analytics team created an Amazon QuickSight visualization report that uses a dataset imported from the S3 files. The data analytics team wants the visualization report to reflect the current data up to the previous day. How can a data analyst meet these requirements?

Options:
a) Schedule an AWS Lambda function to drop and re-create the dataset daily.
b) Configure the visualization to query the data in Amazon S3 directly without loading the data into SPICE.
c) Schedule the dataset to refresh daily.
d) Close and open the Amazon QuickSight visualization.

Answer: c) Schedule the dataset to refresh daily.

Que 03: A financial company uses Amazon EMR for its analytics workloads. During the company's annual security audit, the security team determined that none of the EMR clusters' root volumes are encrypted. The security team recommends the company encrypt its EMR clusters' root volumes as soon as possible. Which solution would meet these requirements?

Options:
a) Enable at-rest encryption for EMR File System (EMRFS) data in Amazon S3 in a security configuration. Re-create the cluster using the newly created security configuration.
b) Specify local disk encryption in a security configuration. Re-create the cluster using the newly created security configuration.
c) Detach the Amazon EBS volumes from the master node. Encrypt the EBS volumes and attach them back to the master node.
d) Re-create the EMR cluster with LZO encryption enabled on all volumes.

Answer: b) Specify local disk encryption in a security configuration. Re-create the cluster using the newly created security configuration.

Que 04: An online retail company wants to perform analytics on data in large Amazon S3 objects using Amazon EMR. An Apache Spark job repeatedly queries the same data to populate an analytics dashboard. The analytics team wants to minimize the time to load the data and create the dashboard. Which approaches could improve the performance? (Select TWO)

Options:
a) Copy the source data into Amazon Redshift and rewrite the Apache Spark code to create analytical reports by querying Amazon Redshift.
b) Copy the source data from Amazon S3 into Hadoop Distributed File System (HDFS) using S3DistCp.
c) Load the data into Spark DataFrames.
d) Stream the data into Amazon Kinesis and use the Kinesis Connector Library (KCL) in multiple Spark jobs to perform analytical jobs.
e) Use Amazon S3 Select to retrieve the data necessary for the dashboards from the S3 objects.

Answer: c) Load the data into Spark DataFrames. e) Use Amazon S3 Select to retrieve the data necessary for the dashboards from the S3 objects.

Que 05: A media company is migrating its on-premises legacy Hadoop cluster, with its associated data processing scripts and workflow, to an Amazon EMR environment running the latest Hadoop release. The developers want to reuse the Java code that was written for data processing jobs on the on-premises cluster. Which approach meets these requirements?

Options:
a) Deploy the existing Oracle Java Archive as a custom bootstrap action and run the job on the EMR cluster.
b) Compile the Java program for the desired Hadoop version and run it using a CUSTOM_JAR step on the EMR cluster.
c) Submit the Java program as an Apache Hive or Apache Spark step for the EMR cluster.
d) Use SSH to connect to the master node of the EMR cluster and submit the Java program using the AWS CLI.

Answer: b) Compile the Java program for the desired Hadoop version and run it using a CUSTOM_JAR step on the EMR cluster.
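For Que 03, the fix is applied through an EMR security configuration rather than on a running cluster: EMR does not let you change encryption settings in place, so you create a security configuration and re-create the cluster with it. A minimal boto3 sketch, assuming a hypothetical KMS key ARN and illustrative cluster settings (on EMR 5.24.0 and later, EnableEbsEncryption extends local disk encryption to the EBS root volume):

import json
import boto3

emr = boto3.client("emr")

# Security configuration enabling local disk encryption, including the
# EBS root volume. The KMS key ARN is a placeholder, not a real key.
sec_config = {
    "EncryptionConfiguration": {
        "EnableInTransitEncryption": False,
        "EnableAtRestEncryption": True,
        "AtRestEncryptionConfiguration": {
            "LocalDiskEncryptionConfiguration": {
                "EncryptionKeyProviderType": "AwsKms",
                "AwsKmsKey": "arn:aws:kms:us-east-1:111122223333:key/EXAMPLE",
                "EnableEbsEncryption": True,
            }
        },
    }
}

emr.create_security_configuration(
    Name="local-disk-encryption",
    SecurityConfiguration=json.dumps(sec_config),
)

# Re-create the cluster, referencing the new security configuration.
emr.run_job_flow(
    Name="analytics-cluster",
    ReleaseLabel="emr-5.30.0",
    Applications=[{"Name": "Spark"}],
    Instances={
        "MasterInstanceType": "m5.xlarge",
        "SlaveInstanceType": "m5.xlarge",
        "InstanceCount": 3,
        "KeepJobFlowAliveWhenNoSteps": True,
    },
    JobFlowRole="EMR_EC2_DefaultRole",
    ServiceRole="EMR_DefaultRole",
    SecurityConfiguration="local-disk-encryption",
)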
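For Que 04, Amazon S3 Select pushes filtering down to S3 so the dashboard job retrieves only the rows and columns it needs instead of whole large objects. A minimal boto3 sketch, assuming a hypothetical bucket, CSV object with a header row, and column names:

import boto3

s3 = boto3.client("s3")

# Retrieve only the needed columns and rows from a large CSV object.
resp = s3.select_object_content(
    Bucket="retail-analytics-data",  # hypothetical bucket and key
    Key="orders/2020/orders.csv",
    ExpressionType="SQL",
    Expression="SELECT s.order_id, s.total FROM s3object s WHERE s.region = 'EU'",
    InputSerialization={"CSV": {"FileHeaderInfo": "USE"}},
    OutputSerialization={"CSV": {}},
)

# The response is an event stream; collect the record payloads.
rows = b"".join(
    event["Records"]["Payload"]
    for event in resp["Payload"]
    if "Records" in event
)
print(rows.decode("utf-8"))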
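For Que 05, reusing the existing Java code comes down to recompiling it against the cluster's Hadoop version and submitting the JAR as a CUSTOM_JAR step. A minimal boto3 sketch, with placeholder cluster ID, JAR location, main class, and arguments:

import boto3

emr = boto3.client("emr")

# Submit the recompiled Java program as a CUSTOM_JAR step.
emr.add_job_flow_steps(
    JobFlowId="j-EXAMPLECLUSTER",  # placeholder cluster ID
    Steps=[
        {
            "Name": "legacy-data-processing",
            "ActionOnFailure": "CONTINUE",
            "HadoopJarStep": {
                "Jar": "s3://my-artifacts/data-processing.jar",  # placeholder JAR
                "MainClass": "com.example.ProcessingJob",        # placeholder class
                "Args": ["s3://my-input/", "s3://my-output/"],
            },
        }
    ],
)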